- 
            International Ocean Discovery Program (IODP) Expedition 399 collected new cores from the Atlantis Massif (30°N; Mid-Atlantic Ridge), an oceanic core complex that hosts the Lost City hydrothermal field (LCHF). Studies of the Atlantis Massif and the LCHF have transformed our understanding of tectonic, magmatic, hydrothermal, and microbial processes at slow-spreading ridges. The Atlantis Massif was the site of four previous expeditions (Integrated Ocean Drilling Program Expeditions 304, 305, and 340T and IODP Expedition 357) and numerous dredging and submersible expeditions. The deepest IODP hole in young (<2 My) oceanic lithosphere, Hole U1309D, was drilled ~5 km north of the LCHF and reached 1415 meters below seafloor (mbsf) through a series of primitive gabbroic rocks. A series of 17 shallow (<16.4 mbsf) holes were also drilled at 9 sites across the south wall of the massif during Expedition 357, recovering heterogeneous rock types including hydrothermally altered peridotites, gabbroic, and basaltic rocks. The hydrologic regime differs between the two locations, with a low-permeability conductive regime in Hole U1309D and a high- (and possibly deep-reaching) permeability regime along the southern wall. Expedition 399 targeted Hole U1309D and the southern wall area to collect new data on ancient processes during deformation and alteration of detachment fault rocks. The recovered rocks and fluids are providing new insights into past and ongoing water-rock interactions, processes of mantle partial melting and gabbro emplacement, deformation over a range of temperatures, abiotic organic synthesis reactions, and the extent and diversity of life in the subseafloor in an actively serpentinizing system. We sampled fluids and measured temperature in Hole U1309D before deepening it to 1498 mbsf. The thermal structure was very similar to that measured during Expedition 340T, and lithologies were comparable to those found previously in Hole U1309D. 
A significant zone of cataclasis and alteration was found at 1451–1474 mbsf. A new Hole U1601C (proposed Site AMDH-02A) was drilled on the southern ridge close to Expedition 357 Hole M0069A, where both deformed and undeformed serpentinites had previously been recovered. Rapid drilling rates achieved a total depth of 1267.8 mbsf through predominantly ultramafic (68%) and gabbroic (32%) rocks, far surpassing the previous 201 m drilling record in a peridotite-dominated system. Recovery was excellent overall (71%) and particularly high in peridotite-dominated sections, where it regularly exceeded 90%. The recovery of sizable sections of largely intact material will provide robust constraints on the architecture and composition of the oceanic mantle lithosphere. The deepest portions of the newly drilled borehole may be beyond the known limits of life, providing the means to assess the role of biological activity across the transition from a biotic to an abiotic regime. Borehole fluids from both holes were collected using both the Kuster Flow-Through Sampler and the new Multi-Temperature Fluid Sampler. Wireline logging in Hole U1601C provided information on downhole density and resistivity, imaged structural features, and documented fracture orientations. A reentry system was installed in Hole U1601C, and both it and Hole U1309D were left open for future deep drilling, fluid sampling, and potential borehole observatories.
- 
            Abstract When a scientific dataset evolves or is reused in workflows creating derived datasets, the integrity of the dataset and its metadata, including provenance, needs to be securely preserved, with assurances that they are not accidentally or maliciously altered in the process. A secure method to efficiently share and verify both the data and its metadata is essential for the reuse of scientific data. The National Science Foundation (NSF)-funded Open Science Chain (OSC) uses a consortium blockchain to provide a cyberinfrastructure solution that maintains the integrity of provenance metadata for published datasets and enables independent verification of the dataset, promoting reuse and reproducibility. The NSF- and National Institutes of Health (NIH)-funded Neuroscience Gateway (NSG) provides a freely available web portal that allows neuroscience researchers to execute computational data analysis pipelines on high performance computing resources. Combined, the OSC and NSG platforms form an efficient, integrated framework to automatically and securely preserve and verify the integrity of the artifacts used in research workflows on the NSG platform. This paper presents the results of the first study that integrates the OSC and NSG frameworks to track the provenance of neurophysiological signal data analysis for studying brain network dynamics using the Neuro-Integrative Connectivity tool, which is deployed on the NSG platform. Database URL: https://www.opensciencechain.org
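The core integrity check described above, fingerprinting a dataset together with its provenance metadata so any later alteration is detectable, can be illustrated with a minimal sketch. This is not the OSC implementation; the function names and metadata fields are hypothetical, and OSC additionally anchors such fingerprints on a consortium blockchain.

```python
import hashlib
import json

def fingerprint(dataset_bytes: bytes, metadata: dict) -> str:
    """Hash the dataset together with its canonicalized provenance metadata."""
    h = hashlib.sha256()
    h.update(dataset_bytes)
    # Canonical JSON (sorted keys, compact separators) so identical
    # metadata always produces an identical hash.
    h.update(json.dumps(metadata, sort_keys=True, separators=(",", ":")).encode())
    return h.hexdigest()

def verify(dataset_bytes: bytes, metadata: dict, recorded: str) -> bool:
    """Independently recompute the fingerprint and compare to the recorded one."""
    return fingerprint(dataset_bytes, metadata) == recorded

# Hypothetical example record for a derived neurophysiology dataset.
data = b"spike-train analysis output, subject 01"
meta = {"source": "NSG pipeline", "derived_from": "raw-eeg-v1", "version": 2}
fp = fingerprint(data, meta)
assert verify(data, meta, fp)             # untouched artifact verifies
assert not verify(data + b"x", meta, fp)  # any alteration is detected
```

Publishing only the fingerprint (not the data) lets a third party verify an artifact they received through any channel, which is the property the OSC–NSG integration relies on.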
- 
            Neutrino-nucleus cross section measurements are needed to improve interaction modeling to meet the precision needs of neutrino experiments in efforts to measure oscillation parameters and search for physics beyond the Standard Model. We review the difficulties associated with modeling neutrino-nucleus interactions that lead to a dependence on event generators in oscillation analyses and cross section measurements alike. We then describe data-driven model validation techniques intended to address this model dependence. The method relies on utilizing various goodness-of-fit tests and the correlations between different observables and channels to probe the model for defects in the phase space relevant for the desired analysis. These techniques shed light on relevant mismodeling, allowing it to be detected before it begins to bias the cross section results. We compare these data-driven techniques to the more commonly used validation methods that directly compare the model against alternatives, and demonstrate their efficacy with fake data studies. These studies demonstrate that employing data-driven model validation in cross section measurements represents a reliable strategy to produce robust results that will stimulate the desired improvements to interaction modeling. Published by the American Physical Society, 2025.
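The goodness-of-fit tests mentioned above typically reduce to a chi-square that weights data–model residuals by the full covariance between bins, so correlated uncertainties are handled consistently. A minimal sketch (toy numbers, not from the paper):

```python
import numpy as np

def chi2_gof(measured, predicted, cov):
    """Covariance-weighted chi-square between a measurement and a prediction.

    Solves cov @ x = r instead of explicitly inverting cov, which is
    numerically safer for nearly degenerate covariance matrices.
    """
    r = np.asarray(measured, float) - np.asarray(predicted, float)
    return float(r @ np.linalg.solve(cov, r))

# Toy two-bin observable with positively correlated uncertainties.
measured  = np.array([105.0, 98.0])
predicted = np.array([100.0, 100.0])
cov = np.array([[25.0, 10.0],
                [10.0, 25.0]])  # 5-count errors per bin, correlated
print(round(chi2_gof(measured, predicted, cov), 3))  # → 1.762
```

Comparing this statistic to its expected distribution over many observables and channels is what lets the method flag mismodeling in the relevant phase space before it biases the extracted cross sections.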
- 
            Large neutrino liquid argon time projection chamber (LArTPC) experiments can broaden their physics reach by reconstructing and interpreting MeV-scale energy depositions, or blips, present in their data. We demonstrate new calorimetric and particle discrimination capabilities at the MeV energy scale using reconstructed blips in data from the MicroBooNE LArTPC at Fermilab. We observe a concentration of low-energy blips around fiberglass mechanical support struts along the time projection chamber edges, with energy spectrum features consistent with the Compton edge of 2.614 MeV decay γ rays. These features are used to verify proper calibration of electron energy scales in MicroBooNE's data to few-percent precision and to measure the specific activity of the fiberglass composing these struts. Cosmogenically produced blips above 3 MeV in reconstructed energy are used to showcase the ability of large LArTPCs to distinguish between low-energy proton and electron energy depositions. An enriched sample of low-energy protons selected using this new particle discrimination technique is found to be smaller in data than in dedicated cosmic-ray simulations, suggesting either incorrect modeling of incident cosmic fluxes or particle transport modeling issues in Geant4. Published by the American Physical Society, 2025.
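The Compton edge used for the energy-scale check follows from single-scatter kinematics: the maximum energy a photon can transfer to an electron is T_max = 2E² / (mₑc² + 2E). A quick sketch of the arithmetic (the function is illustrative, not from the MicroBooNE analysis code):

```python
def compton_edge(e_gamma_mev: float) -> float:
    """Maximum energy (MeV) a photon can transfer to an electron in a
    single Compton scatter (the 180-degree backscatter limit)."""
    m_e = 0.511  # electron rest energy, MeV
    return 2 * e_gamma_mev**2 / (m_e + 2 * e_gamma_mev)

# The 2.614 MeV line yields a Compton edge near 2.38 MeV, the spectral
# feature that anchors the few-percent electron energy-scale check.
print(round(compton_edge(2.614), 3))  # → 2.381
```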
- 
            Train Big, Then Compress: Rethinking Model Size for Efficient Training and Inference of Transformers (Hal Daumé III, Ed.)
            Since hardware resources are limited, the objective of training deep learning models is typically to maximize accuracy subject to the time and memory constraints of training and inference. We study the impact of model size in this setting, focusing on Transformer models for NLP tasks that are limited by compute: self-supervised pretraining and high-resource machine translation. We first show that even though smaller Transformer models execute faster per iteration, wider and deeper models converge in significantly fewer steps. Moreover, this acceleration in convergence typically outpaces the additional computational overhead of using larger models. Therefore, the most compute-efficient training strategy is to counterintuitively train extremely large models but stop after a small number of iterations. This leads to an apparent trade-off between the training efficiency of large Transformer models and the inference efficiency of small Transformer models. However, we show that large models are more robust to compression techniques such as quantization and pruning than small models. Consequently, one can get the best of both worlds: heavily compressed, large models achieve higher accuracy than lightly compressed, small models.
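The pruning the abstract refers to is commonly done by weight magnitude: zero out the smallest-magnitude fraction of a layer's weights, keeping the rest. A minimal NumPy sketch of that idea (illustrative only; the paper's experiments use trained Transformer weights, not random matrices):

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)
print(float(np.mean(pruned == 0.0)))  # → 0.5 (half the weights removed)
```

The paper's finding is that a large model's accuracy degrades more gracefully as `sparsity` (or quantization bit-width) is pushed, which is what makes "train big, then compress" pay off at inference time.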